Learning Composition Models for Phrase Embeddings
Authors

Abstract
Lexical embeddings can serve as useful representations for words for a variety of NLP tasks, but learning embeddings for phrases can be challenging. While separate embeddings are learned for each word, this is infeasible for every phrase. We construct phrase embeddings by learning how to compose word embeddings using features that capture phrase structure and context. We propose efficient unsup...
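The composition idea in the abstract can be sketched as an element-wise weighted sum of word vectors. Everything below is illustrative: the embedding values and the fixed per-word weight vectors are hypothetical stand-ins for parameters the model would learn from phrase structure and context.

```python
import numpy as np

# Hypothetical 4-dimensional word embeddings for the phrase "heavy rain".
emb = {
    "heavy": np.array([0.2, -0.1, 0.5, 0.3]),
    "rain":  np.array([0.4, 0.7, -0.2, 0.1]),
}

def compose(words, weights):
    """Compose word embeddings into a phrase embedding via an
    element-wise weighted sum; the per-word weight vectors stand in
    for weights a model would predict from phrase features."""
    return sum(w * emb[t] for t, w in zip(words, weights))

# Fixed illustrative weights; in the paper's setting these would be
# produced by a learned function of the phrase's structure and context.
weights = [np.full(4, 0.5), np.full(4, 0.5)]
phrase_vec = compose(["heavy", "rain"], weights)
```

With both weight vectors at 0.5 this reduces to simple vector averaging; learned, feature-dependent weights let the composition deviate from that uniform baseline per phrase.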
Similar Articles

Learning Phrase Embeddings from Paraphrases with GRUs
Learning phrase representations has been widely explored in many Natural Language Processing (NLP) tasks (e.g., Sentiment Analysis, Machine Translation) and has shown promising improvements. Previous studies either learn noncompositional phrase representations with general word embedding learning techniques or learn compositional phrase representations based on syntactic structures, which eithe...
Adaptive Joint Learning of Compositional and Non-Compositional Phrase Embeddings
We present a novel method for jointly learning compositional and noncompositional phrase embeddings by adaptively weighting both types of embeddings using a compositionality scoring function. The scoring function is used to quantify the level of compositionality of each phrase, and the parameters of the function are jointly optimized with the objective for learning phrase embeddings. In experim...
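The adaptive weighting described above can be sketched as a convex blend of the two embedding types, with the blend weight given by a compositionality score. The sigmoid scoring, the vectors, and the score value below are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def sigmoid(x):
    """Squash a scalar score into a (0, 1) blending weight."""
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical embeddings for one phrase: a compositional vector built
# from its words, and a non-compositional vector learned directly.
compositional = np.array([0.1, 0.9, 0.0])
noncompositional = np.array([0.8, -0.2, 0.4])

def adaptive_embedding(comp, noncomp, score):
    """Blend the two embeddings with weight alpha = sigmoid(score);
    in the actual method the score comes from a learned
    compositionality scoring function, jointly optimized with the
    embedding objective."""
    alpha = sigmoid(score)
    return alpha * comp + (1.0 - alpha) * noncomp

vec = adaptive_embedding(compositional, noncompositional, score=0.0)
# score = 0.0 gives alpha = 0.5, i.e. a simple average of the two vectors.
```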
Bilingually-constrained Phrase Embeddings for Machine Translation
We propose Bilingually-constrained Recursive Auto-encoders (BRAE) to learn semantic phrase embeddings (compact vector representations for phrases), which can distinguish the phrases with different semantic meanings. The BRAE is trained in a way that minimizes the semantic distance of translation equivalents and maximizes the semantic distance of nontranslation pairs simultaneously. After traini...
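The training signal described above, pulling translation equivalents together while pushing non-translation pairs apart, resembles a margin-based contrastive objective. The sketch below assumes that general form with squared Euclidean distance; it is not the BRAE's exact loss.

```python
import numpy as np

def semantic_distance(a, b):
    """Squared Euclidean distance between two phrase embeddings."""
    return np.sum((a - b) ** 2)

def margin_loss(src, tgt_pos, tgt_neg, margin=1.0):
    """Hinge-style objective in the spirit of the bilingual constraint:
    the loss is zero once the source phrase is closer to its translation
    (tgt_pos) than to a non-translation (tgt_neg) by at least `margin`.
    A hypothetical sketch, not the paper's formulation."""
    return max(0.0, margin
               + semantic_distance(src, tgt_pos)
               - semantic_distance(src, tgt_neg))
```

Minimizing this over translation pairs simultaneously shrinks the distance to equivalents and grows the distance to non-pairs, matching the two goals stated in the abstract.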
BattRAE: Bidimensional Attention-Based Recursive Autoencoders for Learning Bilingual Phrase Embeddings
In this paper, we propose a bidimensional attention based recursive autoencoder (BattRAE) to integrate cues and source-target interactions at multiple levels of granularity into bilingual phrase representations. We employ recursive autoencoders to generate tree structures of phrase with embeddings at different levels of granularity (e.g., words, sub-phrases, phrases). Over these embeddings on t...
Journal

Journal title: Transactions of the Association for Computational Linguistics
Year: 2015
ISSN: 2307-387X
DOI: 10.1162/tacl_a_00135